502 research outputs found

    Scalable Semidefinite Programming using Convex Perturbations

    Get PDF
    Several important machine learning problems can be modeled and solved via semidefinite programs. Often, researchers invoke off-the-shelf software for the associated optimization, which can be inappropriate for many applications due to computational and storage requirements. In this paper, we introduce the use of convex perturbations for semidefinite programs (SDPs). Using a particular perturbation function, we arrive at an algorithm for SDPs that has several advantages over existing techniques: a) it is simple, requiring only a few lines of MATLAB; b) it is a first-order method, which makes it scalable; c) it can easily exploit the structure of a particular SDP to gain efficiency (e.g., when the constraint matrices are low-rank). We demonstrate on several machine learning applications that the proposed algorithm is effective in finding fast approximations to large-scale SDPs.
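
    The abstract notes that the method needs only a few lines of MATLAB; that code is not reproduced here. As a rough illustration of the kind of simple first-order scheme alluded to, the sketch below runs projected gradient descent on a penalized SDP, projecting back onto the PSD cone after each step. The toy instance, penalty weight, and step size are assumptions, and this is not the authors' convex-perturbation algorithm.

```python
# Sketch: a generic projected-gradient method for a penalized SDP
#   minimize  <C, X> + (rho/2) * sum_i (<A_i, X> - b_i)^2   subject to  X PSD,
# in the spirit of the simple first-order schemes the abstract describes
# (this is NOT the paper's convex-perturbation algorithm).
import numpy as np

def proj_psd(X):
    """Project a symmetric matrix onto the PSD cone by clipping negative eigenvalues."""
    w, V = np.linalg.eigh((X + X.T) / 2)
    return (V * np.clip(w, 0.0, None)) @ V.T

def sdp_first_order(C, A_list, b, rho=10.0, step=1e-2, iters=2000):
    X = np.eye(C.shape[0])                        # PSD starting point
    for _ in range(iters):
        residuals = [float(np.sum(A * X)) - bi for A, bi in zip(A_list, b)]
        grad = C + rho * sum(r * A for r, A in zip(residuals, A_list))
        X = proj_psd(X - step * grad)             # gradient step, then project onto the PSD cone
    return X

# Toy instance (assumed for illustration): minimize trace(C X) with trace(X) = 1, X PSD.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
C = (M + M.T) / 2
X_hat = sdp_first_order(C, A_list=[np.eye(5)], b=[1.0])
print(np.trace(X_hat))   # near 1; the penalty enforces the constraint only approximately
```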

    Expanding the Family of Grassmannian Kernels: An Embedding Perspective

    Full text link
    Modeling videos and image-sets as linear subspaces has proven beneficial for many visual recognition tasks. However, it also incurs challenges arising from the fact that linear subspaces do not obey Euclidean geometry, but lie on a special type of Riemannian manifold known as the Grassmannian. To leverage the techniques developed for Euclidean spaces (e.g., support vector machines) with subspaces, several recent studies have proposed to embed the Grassmannian into a Hilbert space by making use of a positive definite kernel. Unfortunately, only two Grassmannian kernels are known, neither of which, as we will show, is universal, which limits their ability to approximate a target function arbitrarily well. Here, we introduce several positive definite Grassmannian kernels, including universal ones, and demonstrate their superiority over previously known kernels in various tasks, such as classification, clustering, sparse coding and hashing.
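
    For context, the two previously known Grassmannian kernels the abstract alludes to are usually taken to be the projection kernel and the Binet-Cauchy kernel. The sketch below computes the projection kernel between subspaces represented by orthonormal bases; it is an illustrative stand-in rather than code from the paper, and the SVD-based basis helper is an assumption.

```python
# Sketch: the projection kernel on the Grassmannian, one of the previously known
# kernels the abstract refers to. Each subspace is represented by an orthonormal
# basis matrix of shape (d, p). Illustrative only, not code from the paper.
import numpy as np

def subspace_basis(data, p):
    """Orthonormal basis of the p-dimensional subspace spanned by an image set.
    `data` is (d, n): n feature vectors of dimension d (illustrative helper)."""
    U, _, _ = np.linalg.svd(data, full_matrices=False)
    return U[:, :p]

def projection_kernel(Y1, Y2):
    """k(Y1, Y2) = ||Y1^T Y2||_F^2, a positive definite kernel between subspaces."""
    return np.linalg.norm(Y1.T @ Y2, ord="fro") ** 2

# Toy usage: a Gram matrix over a few image sets, e.g. as input to a kernel SVM.
rng = np.random.default_rng(1)
sets = [rng.standard_normal((20, 15)) for _ in range(4)]
bases = [subspace_basis(S, p=3) for S in sets]
K = np.array([[projection_kernel(Yi, Yj) for Yj in bases] for Yi in bases])
print(K.shape)   # (4, 4) kernel matrix
```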

    In-Space Propulsion, Logistics Reduction, and Evaluation of Steam Reformer Kinetics: Problems and Prospects

    Get PDF
    Human space missions generate waste materials. A 70-kg crewmember creates a waste stream of 1 kg per day, and a four-person crew on a deep space habitat for a 400+ day mission would create over 1600 kg of waste. Converted into methane, the carbon could be used as a fuel for propulsion or power. The NASA Advanced Exploration Systems (AES) Logistics Reduction and Repurposing (LRR) project is investing in space resource utilization with an emphasis on repurposing logistics materials for useful purposes and has selected steam reforming, among many different competitive processes, as the preferred method for repurposing organic waste into methane. Already demonstrated at the relevant processing rate of 5.4 kg of waste per day, high-temperature oxygenated steam consumes waste and produces carbon dioxide, carbon monoxide, and hydrogen, which can then be converted into methane catalytically. However, the steam reforming process has not been studied in microgravity. Data are critically needed to understand the mechanisms that allow use of steam reforming in a reduced gravity environment. This paper reviews the relevant literature, identifies gravity-dependent mechanisms within the steam gasification process, and describes an innovative experiment to acquire the crucial kinetic information in a small-scale reactor specifically designed to operate within the requirements of a reduced gravity aircraft flight. The experiment will determine if the steam reformer process is mass-transport limited, and if so, what level of forced convection will be needed to obtain performance comparable to that in 1-g.
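
    The waste-stream sizing quoted above follows from simple arithmetic; a minimal sketch, assuming the stated rate of 1 kg of waste per crewmember per day:

```python
# Sketch of the waste-stream arithmetic quoted in the abstract.
WASTE_PER_CREW_KG_PER_DAY = 1.0    # stated rate for a 70-kg crewmember
CREW_SIZE = 4
MISSION_DAYS = 400                 # "400+ day" deep space habitat mission

total_waste_kg = WASTE_PER_CREW_KG_PER_DAY * CREW_SIZE * MISSION_DAYS
print(total_waste_kg)              # 1600.0 kg, matching the "over 1600 kg" figure

# The demonstrated processing rate of 5.4 kg/day covers the roughly 4 kg/day
# generated by a four-person crew.
print(CREW_SIZE * WASTE_PER_CREW_KG_PER_DAY / 5.4)   # fraction of reformer capacity used
```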

    Variational Deep Semantic Hashing for Text Documents

    Full text link
    As the amount of textual data has been rapidly increasing over the past decade, efficient similarity search methods have become a crucial component of large-scale information retrieval systems. A popular strategy is to represent original data samples by compact binary codes through hashing. A spectrum of machine learning methods have been utilized, but they often lack the expressiveness and flexibility in modeling needed to learn effective representations. The recent advances of deep learning in a wide range of applications have demonstrated its capability to learn robust and powerful feature representations for complex data. In particular, deep generative models naturally combine the expressiveness of probabilistic generative models with the high capacity of deep neural networks, which is very suitable for text modeling. However, little work has leveraged the recent progress in deep learning for text hashing. In this paper, we propose a series of novel deep document generative models for text hashing. The first proposed model is unsupervised, while the second one is supervised by utilizing document labels/tags for hashing. The third model further considers document-specific factors that affect the generation of words. The probabilistic generative formulation of the proposed models provides a principled framework for model extension, uncertainty estimation, simulation, and interpretability. Based on variational inference and reparameterization, the proposed models can be interpreted as encoder-decoder deep neural networks and are thus capable of learning complex nonlinear distributed representations of the original documents. We conduct a comprehensive set of experiments on four public testbeds. The experimental results demonstrate the effectiveness of the proposed supervised learning models for text hashing. Comment: 11 pages, 4 figures.
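
    A rough sketch of the general model family described: a variational encoder-decoder over bag-of-words documents with the reparameterization trick, from which binary hash codes are read off the latent means. The layer sizes, multinomial reconstruction term, and median thresholding are illustrative assumptions rather than the paper's exact architecture; PyTorch is used here purely for concreteness.

```python
# Sketch (not the paper's exact model): a VAE-style document hashing network.
# A bag-of-words document is encoded into a latent Gaussian, sampled with the
# reparameterization trick, decoded with a softmax over the vocabulary, and the
# latent means are thresholded at their per-bit median to obtain hash codes.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalDocHasher(nn.Module):
    def __init__(self, vocab_size, code_bits=32, hidden=500):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, code_bits)
        self.logvar = nn.Linear(hidden, code_bits)
        self.dec = nn.Linear(code_bits, vocab_size)

    def forward(self, bow):                         # bow: (batch, vocab) term counts
        h = self.enc(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
        logits = self.dec(z)
        recon = -(bow * F.log_softmax(logits, dim=-1)).sum(-1).mean()   # multinomial NLL
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return recon + kl, mu

    @torch.no_grad()
    def hash_codes(self, bow):
        _, mu = self.forward(bow)
        return (mu > mu.median(dim=0).values).to(torch.uint8)   # threshold each bit

# Toy usage with random "documents" (illustrative only).
model = VariationalDocHasher(vocab_size=1000)
docs = torch.randint(0, 3, (8, 1000)).float()
loss, _ = model(docs)
loss.backward()
print(model.hash_codes(docs).shape)   # (8, 32) binary codes
```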

    Robust Trajectory Planning for Autonomous Parafoils under Wind Uncertainty

    Get PDF
    A key challenge facing modern airborne delivery systems, such as parafoils, is the ability to accurately and consistently deliver supplies into difficult, complex terrain. Robustness is a primary concern, given that environmental wind disturbances are often highly uncertain and time-varying, coupled with under-actuated dynamics and potentially narrow drop zones. This paper presents a new on-line trajectory planning algorithm that enables a large, autonomous parafoil to robustly execute collision avoidance and precision landing on mapped terrain, even with significant wind uncertainties. This algorithm is designed to handle arbitrary initial altitudes, approach geometries, and terrain surfaces, and is robust to wind disturbances which may be highly dynamic throughout the terminal approach. Explicit, real-time wind modeling and classification is used to anticipate future disturbances, while a novel uncertainty-sampling technique ensures that robustness to possible future variation is efficiently maintained. The designed cost-to-go function enables selection of partial paths which intelligently trade off between current and reachable future states. Simulation results demonstrate that the proposed algorithm reduces the worst-case impact of wind disturbances relative to state-of-the-art approaches. (Charles Stark Draper Laboratory)
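
    A schematic sketch of the uncertainty-sampling idea described above: each candidate path is evaluated under many sampled wind profiles and the path with the best worst-case landing cost is kept. The wind model, drift dynamics, and miss-distance cost below are placeholders, not the paper's planner or its cost-to-go function.

```python
# Sketch of scenario-based robust path selection: simulate each candidate path
# under sampled wind profiles and keep the one with the best worst-case landing
# cost. A schematic stand-in for the idea, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(2)

def sample_wind(horizon, sigma=1.5):
    """Illustrative wind model: a random constant drift plus gusts (m/s)."""
    return rng.normal(0.0, sigma, size=2) + rng.normal(0.0, 0.5, size=(horizon, 2))

def simulate_landing(path, wind, dt=1.0):
    """Displace the nominal path by the integrated wind drift; return the touchdown point."""
    drift = np.cumsum(wind * dt, axis=0)
    return path[-1] + drift[-1]

def landing_cost(touchdown, target):
    return float(np.linalg.norm(touchdown - target))   # miss distance as a stand-in cost

def choose_robust_path(candidates, target, n_samples=50, horizon=30):
    winds = [sample_wind(horizon) for _ in range(n_samples)]
    worst_costs = [
        max(landing_cost(simulate_landing(p, w), target) for w in winds)
        for p in candidates
    ]
    return int(np.argmin(worst_costs))   # minimize the worst-case miss distance

# Toy usage: three straight-line candidate approaches toward the origin.
target = np.zeros(2)
candidates = [np.linspace(start, target, 30) for start in ([100, 0], [80, 60], [0, 110])]
print(choose_robust_path(candidates, target))
```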

    Green Aerospace Fuels from Nonpetroleum Sources

    Get PDF
    Efforts to produce green aerospace propellants from nonpetroleum sources are outlined. The paper begins with an overview of feedstock processing and relevant small molecule or C1 chemistry. Gas-to-liquid technologies, notably Fischer-Tropsch (FT) processing of synthesis gas (CO and H2), are being optimized at the NASA Glenn Research Center (GRC) to enhance the fraction of the product stream relevant to aviation (and other transportation) fuels. Efforts to produce optimized catalysts are described. Given the high cost of space launch, the recycling of human metabolic and plastic wastes to reduce the need to transport consumables to orbit to support the crew of a space station has long been recognized as a high priority. If the much larger costs of transporting consumables to the Moon or beyond are taken into account, the importance of developing waste recycling systems becomes still more imperative. One promising way to transform organic waste products into useful gases is steam reformation; this well-known technology is currently being optimized by a Colorado company for exploration and planetary surface operations. Reduction of terrestrial waste streams while producing energy and/or valuable raw materials is an opportunity being realized by a new generation of visionary entrepreneurs. A technology developed in Northeast Ohio that has successfully demonstrated production of fuels and related chemicals from waste plastics is described. Technologies being developed by a Massachusetts company to remove sulfur impurities are highlighted. Common issues and concerns for nonpetroleum fuel production are emphasized. Energy utilization is a concern for fuel production whether in a terrestrial operation or on the lunar (or Martian) surface; the term "green" relates not only to mitigating excess carbon release but also to the efficiency of grid-energy usage. For space exploration, energy efficiency can be an essential concern. Other issues of great concern include minimizing impurities in the product stream(s), especially those that pose potential health risks and/or could degrade operations through catalyst poisoning or equipment damage. The potential impacts of such concerns on future missions are addressed in closing.

    Microstructural and Mechanical Characterization of a Dispersion Strengthened Medium Entropy Alloy Produced Using Selective Laser Melting

    Get PDF
    High entropy alloys (HEAs) are an interesting new class of alloys which have been shown to exhibit both notable strength and ductility over a wide range of temperatures and stresses. In addition, the remarkably small difference between the solvus and liquidus temperatures of many face centered cubic HEAs makes them excellent candidates for selective laser melting fabrication. In this study, the microstructure and mechanical properties of a dispersion strengthened equiatomic NiCoCr alloy successfully produced using selective laser melting are explored. The effects of laser speed, laser power, and powder recyclability on final part density and microstructural segregation are analyzed through both x-ray diffraction and high resolution scanning electron microscopy. These results are further validated and compared against stable phase predictions produced using a commercially available high entropy alloy mobility database. Lastly, the tensile strengths resulting from different heat treatment pathways are detailed.

    Measurement of shower development and its Molière radius with a four-plane LumiCal test set-up

    Get PDF
    A prototype of a luminometer, designed for a future e+e- collider detector and consisting at present of a four-plane module, was tested in the CERN PS accelerator T9 beam. The objective of this beam test was to demonstrate multi-plane tungsten/silicon operation, to study the development of the electromagnetic shower, and to compare it with MC simulations. The Molière radius has been determined to be 24.0 ± 0.6 (stat.) ± 1.5 (syst.) mm using a parametrization of the shower shape. Very good agreement was found between data and a detailed Geant4 simulation. Comment: Paper published in Eur. Phys. J.; includes 25 figures and 3 tables.
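
    For orientation, the Molière radius characterizes transverse shower containment: on average roughly 90% of the deposited energy lies within one Molière radius of the shower axis. The paper extracts it from a shower-shape parametrization; the sketch below instead uses the simpler containment definition on a made-up radial profile, and also shows the common quadrature combination of the quoted statistical and systematic uncertainties.

```python
# Sketch: a containment-based estimate of a "Molière-like" radius from a radial
# energy profile, plus the quadrature combination of the quoted uncertainties.
# Illustrative only; the paper uses a parametrization of the shower shape.
import numpy as np

def containment_radius(r, dE, fraction=0.90):
    """Radius containing `fraction` of the energy, from a radial profile dE(r)."""
    order = np.argsort(r)
    cum = np.cumsum(dE[order]) / np.sum(dE)
    return float(np.interp(fraction, cum, r[order]))

# Toy radial profile (exponential falloff, made up purely for illustration).
r = np.linspace(0.5, 80.0, 160)           # mm
dE = r * np.exp(-r / 10.0)                # 2*pi*r weighting of an exponential falloff
print(containment_radius(r, dE))          # a few tens of mm for this toy profile

# Combining the quoted statistical and systematic uncertainties in quadrature:
stat, syst = 0.6, 1.5
print((stat**2 + syst**2) ** 0.5)         # ~1.6 mm total uncertainty on the 24.0 mm value
```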

    Beyond Volume: The Impact of Complex Healthcare Data on the Machine Learning Pipeline

    Full text link
    From medical charts to national censuses, healthcare has traditionally operated under a paper-based paradigm. However, the past decade has marked a long and arduous transformation bringing healthcare into the digital age. Ranging from electronic health records, to digitized imaging and laboratory reports, to public health datasets, healthcare today generates an incredible amount of digital information. Such a wealth of data presents an exciting opportunity for integrated machine learning solutions to address problems across multiple facets of healthcare practice and administration. Unfortunately, the ability to derive accurate and informative insights requires more than the ability to execute machine learning models. Rather, a deeper understanding of the data on which the models are run is imperative for their success. While a significant effort has been undertaken to develop models able to process the volume of data obtained during the analysis of millions of digitized patient records, it is important to remember that volume represents only one aspect of the data. In fact, drawing on data from an increasingly diverse set of sources, healthcare data presents an incredibly complex set of attributes that must be accounted for throughout the machine learning pipeline. This chapter focuses on highlighting such challenges, and is broken down into three distinct components, each representing a phase of the pipeline. We begin with attributes of the data accounted for during preprocessing, then move to considerations during model building, and end with challenges to the interpretation of model output. For each component, we present a discussion around data as it relates to the healthcare domain and offer insight into the challenges each may impose on the efficiency of machine learning techniques. Comment: Healthcare Informatics, Machine Learning, Knowledge Discovery; 20 pages, 1 figure.
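
    As a concrete, if simplified, illustration of the preprocessing phase the chapter discusses, the sketch below assembles a small scikit-learn pipeline that imputes missing values and encodes mixed numeric and categorical EHR-style features before fitting a classifier. scikit-learn, the column names, and the toy records are assumptions, not content from the chapter.

```python
# Sketch of the kind of preprocessing the chapter's pipeline discussion covers:
# mixed numeric/categorical EHR-style features with missing values, imputed and
# encoded before model fitting. Column names and records are made up.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

numeric = ["age", "heart_rate", "creatinine"]
categorical = ["sex", "admission_type"]

preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore"))]), categorical),
])

model = Pipeline([("prep", preprocess), ("clf", LogisticRegression(max_iter=1000))])

# Toy records with missing entries, standing in for digitized patient data.
X = pd.DataFrame({"age": [64, np.nan, 51], "heart_rate": [88, 102, np.nan],
                  "creatinine": [1.1, 2.3, 0.9], "sex": ["F", "M", np.nan],
                  "admission_type": ["emergency", "elective", "emergency"]})
y = [1, 1, 0]
model.fit(X, y)
print(model.predict(X))
```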

    Performance of fully instrumented detector planes of the forward calorimeter of a Linear Collider detector

    Get PDF
    Detector-plane prototypes of the very forward calorimetry of a future detector at an e+e- collider have been built, and their performance was measured in an electron beam. The detector plane comprises silicon or GaAs pad sensors, dedicated front-end and ADC ASICs, and an FPGA for data concentration. Measurements of the signal-to-noise ratio and of the response as a function of the position of the sensor are presented. A deconvolution method is successfully applied, and a comparison of the measured shower shape as a function of the absorber depth with a Monte Carlo simulation is given. Comment: 25 pages, 32 figures; revised version following comments from referees.
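
    A minimal sketch of how a pad-sensor signal-to-noise ratio is conventionally quoted: the most probable pedestal-subtracted signal amplitude divided by the pedestal RMS. This is a generic illustration with made-up numbers, not the collaboration's analysis or its deconvolution method.

```python
# Sketch of a conventional signal-to-noise figure for a pad sensor: the most
# probable pedestal-subtracted signal amplitude divided by the pedestal RMS.
# Generic illustration only, not the analysis used in the paper.
import numpy as np

def signal_to_noise(pedestal_adc, signal_adc):
    pedestal = pedestal_adc.mean()
    noise = pedestal_adc.std(ddof=1)                 # pedestal RMS in ADC counts
    amplitudes = signal_adc - pedestal               # pedestal subtraction
    # Histogram mode as a simple stand-in for a Landau most-probable value.
    counts, edges = np.histogram(amplitudes, bins=50)
    mpv = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
    return mpv / noise

# Toy data: Gaussian pedestal plus a broad signal peak (illustrative only).
rng = np.random.default_rng(3)
ped = rng.normal(100.0, 2.0, 5000)
sig = 100.0 + rng.normal(40.0, 8.0, 2000)
print(signal_to_noise(ped, sig))                     # ~20 for these made-up numbers
```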